Sequences of Inequalities Among New Divergence Measures
Author
Inder Jeet Taneja, Departamento de Matemática, Universidade Federal de Santa Catarina, 88.040-900 Florianópolis, SC, Brazil. E-mail: [email protected], http://www.mtm.ufsc.br/∼taneja

Abstract
There are three classical divergence measures in the literature on information theory and statistics, namely, the Jeffreys-Kullback-Leibler [5, 6] J-divergence, the Sibson-Burbea-Rao [1] Jensen-Shannon divergence, and the Taneja [9] arithmetic-geometric mean divergence. These three measures bear an interesting relationship among each other and are based on logarithmic expressions. Divergence measures such as the Hellinger discrimination, the symmetric χ²-divergence, and the triangular discrimination are also known in the literature and are not based on logarithmic expressions. In recent years, Dragomir et al. [3], Kumar and Johnson [7], and Jain and Srivastava [4] have studied different kinds of divergence measures. In this paper, we present some more new divergence measures, obtain inequalities relating these new measures, and make connections with the previous ones. The idea of exponential divergence is also introduced.
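For reference, the commonly used forms of the measures named above can be sketched as follows for probability distributions $P=(p_1,\ldots,p_n)$ and $Q=(q_1,\ldots,q_n)$; these are the standard definitions found in the literature and are given here only as orientation, not necessarily in the exact notation of the paper:

\[
\begin{aligned}
J(P\|Q) &= \sum_{i=1}^{n}(p_i-q_i)\ln\frac{p_i}{q_i} && \text{(Jeffreys-Kullback-Leibler J-divergence)}\\[2pt]
I(P\|Q) &= \frac{1}{2}\sum_{i=1}^{n}\left[p_i\ln\frac{2p_i}{p_i+q_i}+q_i\ln\frac{2q_i}{p_i+q_i}\right] && \text{(Jensen-Shannon divergence)}\\[2pt]
T(P\|Q) &= \sum_{i=1}^{n}\frac{p_i+q_i}{2}\,\ln\frac{p_i+q_i}{2\sqrt{p_i q_i}} && \text{(arithmetic-geometric mean divergence)}\\[2pt]
h(P\|Q) &= \frac{1}{2}\sum_{i=1}^{n}\left(\sqrt{p_i}-\sqrt{q_i}\right)^2 && \text{(Hellinger discrimination)}\\[2pt]
\Psi(P\|Q) &= \sum_{i=1}^{n}\frac{(p_i-q_i)^2(p_i+q_i)}{p_i q_i} && \text{(symmetric }\chi^2\text{-divergence)}\\[2pt]
\Delta(P\|Q) &= \sum_{i=1}^{n}\frac{(p_i-q_i)^2}{p_i+q_i} && \text{(triangular discrimination)}
\end{aligned}
\]

The first three are the logarithmic measures referred to in the abstract, while the last three are the non-logarithmic ones.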
Similar Articles
Information Measures via Copula Functions
In applications of differential geometry to problems of parametric inference, the notion of divergence is often used to measure the separation between two parametric densities. Among these, in this paper, we will verify measures such as the Kullback-Leibler information, J-divergence, Hellinger distance, -divergence, … and so on. Properties and results related to the distance between probability d...
Inequalities among Differences of Gini Means and Divergence Measures
In 1938, Gini [3] studied a mean having two parameters. Later, many authors studied properties of this mean. In particular, it contains such famous means as the harmonic, geometric, and arithmetic means. Here we consider a sequence of inequalities arising from particular values of each parameter of Gini's mean. This sequence generates many nonnegative differences. Not all of them are convex. We have ...
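As context for this entry, the two-parameter mean studied by Gini (1938) is usually written, for positive numbers $a$ and $b$, as below; the symbol $E_{r,s}$ is an assumed notation, not necessarily that of the cited paper:

\[
E_{r,s}(a,b) = \left(\frac{a^r+b^r}{a^s+b^s}\right)^{\frac{1}{r-s}}, \qquad r \neq s,
\]

so that, for instance, $E_{1,0}$ gives the arithmetic mean, $E_{0,-1}$ the harmonic mean, and the limiting case $r=s=0$ the geometric mean $\sqrt{ab}$.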
Some Inequalities Among New Divergence Measures
Abstract There are three classical divergence measures in the literature on information theory and statistics, namely, the Jeffreys-Kullback-Leibler J-divergence, the Burbea-Rao [1] Jensen-Shannon divergence, and the Taneja [8] arithmetic-geometric mean divergence. These three measures bear an interesting relationship among each other and are based on logarithmic expressions. The divergence m...
Refinement Inequalities among Symmetric Divergence Measures
There are three classical divergence measures in the literature on information theory and statistics, namely, Jeffreys-Kullback-Leibler's J-divergence, Sibson-Burbea-Rao's Jensen-Shannon divergence, and Taneja's arithmetic-geometric mean divergence. These bear an interesting relationship among each other and are based on logarithmic expressions. The divergence measures like Hellinger discriminatio...
New Jensen and Ostrowski Type Inequalities for General Lebesgue Integral with Applications
Some new inequalities related to the Jensen and Ostrowski inequalities for the general Lebesgue integral are obtained. Applications to the $f$-divergence measure are provided as well.
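For context, the $f$-divergence mentioned here is, in its standard (Csiszár) form, defined for a convex function $f$ with $f(1)=0$ and probability distributions $P$ and $Q$ as sketched below; this is the usual textbook definition, not a statement of the cited paper's results:

\[
D_f(P\|Q) = \sum_{i=1}^{n} q_i\, f\!\left(\frac{p_i}{q_i}\right),
\]

so that, for example, $f(t)=t\ln t$ recovers the Kullback-Leibler divergence.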
Journal title:
Volume, Issue:
Pages: -
Publication date: 2010